5.0 - 10.0 years
5 - 10 Lacs
Thiruvananthapuram / Trivandrum, Kerala, India
On-site
Snowflake Data Warehouse Development: Design, implement, and optimize data warehouses on the Snowflake cloud platform. Ensure the effective utilization of Snowflake's features for scalable, efficient, and high-performance data storage and processing. Data Pipeline Development: Develop, implement, and optimize end-to-end data pipelines on the Snowflake platform. Design and maintain ETL workflows to enable seamless data processing across systems. Data Transformation with PySpark: Leverage PySpark for data transformations within the Snowflake environment. Implement complex data cleansing, enrichment, and validation processes using PySpark to ensure the highest data quality. Collaboration: Work closely with cross-functional teams to design data solutions aligned with business requirements. Engage with stakeholders to understand business needs and translate them into technical solutions.
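As an illustration of the PySpark transformation work described above, here is a minimal cleansing and validation sketch. The table paths, column names, and rules are hypothetical (not taken from the posting), and the Snowflake read/write side is omitted for brevity.

# Minimal PySpark cleansing/validation sketch (hypothetical paths, columns, and rules).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer_cleansing").getOrCreate()

# Read raw data previously staged from the warehouse (path is an assumption).
raw = spark.read.parquet("s3://example-bucket/staging/customers/")

cleansed = (
    raw.dropDuplicates(["customer_id"])                      # de-duplicate on the business key
       .filter(F.col("customer_id").isNotNull())             # drop rows missing the key
       .withColumn("email", F.lower(F.trim(F.col("email")))) # normalize email
       .withColumn(
           "is_valid_email",                                 # simple validation flag
           F.col("email").rlike(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
       )
)

cleansed.write.mode("overwrite").parquet("s3://example-bucket/curated/customers/")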
Posted 1 month ago
5.0 - 10.0 years
5 - 10 Lacs
Cochin / Kochi / Ernakulam, Kerala, India
On-site
Snowflake Data Warehouse Development: Design, implement, and optimize data warehouses on the Snowflake cloud platform. Ensure the effective utilization of Snowflake's features for scalable, efficient, and high-performance data storage and processing. Data Pipeline Development: Develop, implement, and optimize end-to-end data pipelines on the Snowflake platform. Design and maintain ETL workflows to enable seamless data processing across systems. Data Transformation with PySpark: Leverage PySpark for data transformations within the Snowflake environment. Implement complex data cleansing, enrichment, and validation processes using PySpark to ensure the highest data quality. Collaboration: Work closely with cross-functional teams to design data solutions aligned with business requirements. Engage with stakeholders to understand business needs and translate them into technical solutions.
Posted 1 month ago
6.0 - 11.0 years
6 - 11 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH, with 2-4 years implementing DWH on Snowflake using Matillion. Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake; develop and debug ETL programs primarily in Matillion Cloud. Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions. We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure. Perform data validation and end-to-end testing of ETL objects, source data analysis, and data profiling. Troubleshoot and resolve issues related to Matillion development and data integration. Collaborate with business users to create architecture aligned with business needs. Collaborate in developing project requirements for end-to-end data integration using ETL for structured, semi-structured, and unstructured data. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Matillion Cloud data pipelines, with the ability to troubleshoot issues quickly. Experience with SnowSQL and Snowpipe is an added advantage.
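For the data validation and end-to-end testing work mentioned above, one simple pattern is a row-count reconciliation between a staging table and its target. The sketch below uses the snowflake-connector-python package; connection parameters and table names are placeholders, and in a Matillion-based project such checks would usually live inside the orchestration jobs themselves.

# Row-count reconciliation between staging and target (hypothetical names and credentials).
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder connection details
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

def row_count(table_name):
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table_name}")
        return cur.fetchone()[0]

staging, target = row_count("STG_ORDERS"), row_count("DIM_ORDERS")
if staging != target:
    raise ValueError(f"Count mismatch: staging={staging}, target={target}")
print("Row counts reconcile:", target)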
Posted 1 month ago
6.0 - 11.0 years
6 - 11 Lacs
Pune, Maharashtra, India
On-site
Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH, with 2-4 years implementing DWH on Snowflake using Matillion. Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake; develop and debug ETL programs primarily in Matillion Cloud. Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions. We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure. Perform data validation and end-to-end testing of ETL objects, source data analysis, and data profiling. Troubleshoot and resolve issues related to Matillion development and data integration. Collaborate with business users to create architecture aligned with business needs. Collaborate in developing project requirements for end-to-end data integration using ETL for structured, semi-structured, and unstructured data. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Matillion Cloud data pipelines, with the ability to troubleshoot issues quickly. Experience with SnowSQL and Snowpipe is an added advantage.
Posted 1 month ago
6.0 - 11.0 years
6 - 11 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH, with 2-4 years implementing DWH on Snowflake using Matillion. Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake; develop and debug ETL programs primarily in Matillion Cloud. Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions. We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure. Perform data validation and end-to-end testing of ETL objects, source data analysis, and data profiling. Troubleshoot and resolve issues related to Matillion development and data integration. Collaborate with business users to create architecture aligned with business needs. Collaborate in developing project requirements for end-to-end data integration using ETL for structured, semi-structured, and unstructured data. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Matillion Cloud data pipelines, with the ability to troubleshoot issues quickly. Experience with SnowSQL and Snowpipe is an added advantage.
Posted 1 month ago
11.0 - 21.0 years
11 - 21 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Oversees and designs the information architecture for the data warehouse, including all information structures (staging area, data warehouse, data marts, operational data stores); oversees standardization of data definitions and the development of physical and logical modelling. Deep understanding of Data Warehousing, Enterprise Architectures, Dimensional Modelling, Star and Snowflake schema design, Reference DW Architectures, ETL Architecture, ETL (Extract, Transform, Load), Data Analysis, Data Conversion and Transformation, Database Design, Data Warehouse Optimization, Data Mart Development, and Enterprise Data Warehouse Maintenance and Support. Significant experience working as a Data Architect with depth in data integration and data architecture for Enterprise Data Warehouse implementations (conceptual, logical, physical, and dimensional models). Maintain in-depth and current knowledge of cloud architecture, data lakes, data warehouses, BI platforms, analytics model platforms, and ETL tools. Cloud knowledge, especially AWS, is necessary. Well-versed in best practices around Data Governance, Data Stewardship, and overall Data Quality initiatives. Inventories existing data designs, including data flows and systems; designs data models that integrate new and existing environments; conducts architecture meetings with team leads and Solution Architects to communicate complex ideas, issues, and system processes, along with architecture discussions in their current projects. Design the data warehouse and provide guidance to the team in implementation using Snowflake and SnowSQL. Good hands-on experience in converting Source Independent Load, Post Load Process, Stored Procedures, and SQL to Snowflake. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Snowflake pipelines, with the ability to troubleshoot issues quickly. 1 year of work experience in DBT. Data modeling and data integration. Advanced SQL skills for analysis and standardizing queries. Mandatory Skillset: Snowflake, DBT, and Data Architecture Design experience in Data Warehouse.
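To make the Snowflake implementation guidance above concrete, here is a minimal Snowpark for Python sketch that builds a dimension-style table from a staging table. The connection parameters, table names, and columns are assumptions for illustration only, not details from the role.

# Minimal Snowpark sketch: build a dimension table from staging (hypothetical names).
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, current_timestamp

connection_parameters = {        # placeholder credentials
    "account": "my_account",
    "user": "architect_user",
    "password": "***",
    "warehouse": "DW_WH",
    "database": "EDW",
    "schema": "CORE",
}
session = Session.builder.configs(connection_parameters).create()

stg = session.table("STG_CUSTOMER")

dim_customer = (
    stg.select(col("CUSTOMER_ID"), col("CUSTOMER_NAME"), col("SEGMENT"))
       .distinct()
       .with_column("LOAD_TS", current_timestamp())  # audit column
)

# Materialize the dimension; a simple overwrite keeps the sketch short.
dim_customer.write.mode("overwrite").save_as_table("DIM_CUSTOMER")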
Posted 1 month ago
11.0 - 21.0 years
11 - 21 Lacs
Pune, Maharashtra, India
On-site
Oversees and designs the information architecture for the data warehouse, including all information structures (staging area, data warehouse, data marts, operational data stores); oversees standardization of data definitions and the development of physical and logical modelling. Deep understanding of Data Warehousing, Enterprise Architectures, Dimensional Modelling, Star and Snowflake schema design, Reference DW Architectures, ETL Architecture, ETL (Extract, Transform, Load), Data Analysis, Data Conversion and Transformation, Database Design, Data Warehouse Optimization, Data Mart Development, and Enterprise Data Warehouse Maintenance and Support. Significant experience working as a Data Architect with depth in data integration and data architecture for Enterprise Data Warehouse implementations (conceptual, logical, physical, and dimensional models). Maintain in-depth and current knowledge of cloud architecture, data lakes, data warehouses, BI platforms, analytics model platforms, and ETL tools. Cloud knowledge, especially AWS, is necessary. Well-versed in best practices around Data Governance, Data Stewardship, and overall Data Quality initiatives. Inventories existing data designs, including data flows and systems; designs data models that integrate new and existing environments; conducts architecture meetings with team leads and Solution Architects to communicate complex ideas, issues, and system processes, along with architecture discussions in their current projects. Design the data warehouse and provide guidance to the team in implementation using Snowflake and SnowSQL. Good hands-on experience in converting Source Independent Load, Post Load Process, Stored Procedures, and SQL to Snowflake. Strong understanding of ELT/ETL and integration concepts and design best practices. Experience in performance tuning of Snowflake pipelines, with the ability to troubleshoot issues quickly. 1 year of work experience in DBT. Data modeling and data integration. Advanced SQL skills for analysis and standardizing queries. Mandatory Skillset: Snowflake, DBT, and Data Architecture Design experience in Data Warehouse.
Posted 1 month ago
14.0 - 24.0 years
14 - 24 Lacs
Mumbai, Maharashtra, India
On-site
Cloud Data Architect: Cloud Architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing. Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation on AWS and Azure Cloud. Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services. Participate in pre-sales activities, including RFP and proposal writing. Experience integrating different data sources with data warehouses and data lakes is required. Experience in creating data warehouses and data lakes for reporting, AI, and machine learning. Understanding of data modelling and data architecture concepts. Participate in proposal and capability presentations. Able to clearly articulate the pros and cons of various technologies and platforms. Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms. Define and implement cloud governance and best practices. Identify and implement automation opportunities to increase operational efficiency. Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies.
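As a hedged illustration of moving data between the Azure Databricks and Snowflake services listed above, the PySpark sketch below writes a curated DataFrame to Snowflake through the Spark-Snowflake connector. The connection options and table names are placeholders; in Databricks, credentials would normally come from a secret scope rather than literals, and the pre-created notebook session is assumed.

# PySpark sketch: write a curated DataFrame from Databricks to Snowflake.
# Connection options and names are placeholders for illustration.
sf_options = {
    "sfURL": "my_account.snowflakecomputing.com",
    "sfUser": "databricks_svc",
    "sfPassword": "***",          # use a Databricks secret scope in practice
    "sfDatabase": "ANALYTICS",
    "sfSchema": "CURATED",
    "sfWarehouse": "LOAD_WH",
}

curated_df = spark.table("lake.curated_sales")   # assumes the Databricks-provided Spark session

(
    curated_df.write
    .format("snowflake")
    .options(**sf_options)
    .option("dbtable", "FACT_SALES")
    .mode("overwrite")
    .save()
)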
Posted 1 month ago
14.0 - 24.0 years
14 - 24 Lacs
Pune, Maharashtra, India
On-site
Cloud Data Architect: Cloud Architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing. Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation on AWS and Azure Cloud. Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services. Participate in pre-sales activities, including RFP and proposal writing. Experience integrating different data sources with data warehouses and data lakes is required. Experience in creating data warehouses and data lakes for reporting, AI, and machine learning. Understanding of data modelling and data architecture concepts. Participate in proposal and capability presentations. Able to clearly articulate the pros and cons of various technologies and platforms. Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms. Define and implement cloud governance and best practices. Identify and implement automation opportunities to increase operational efficiency. Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies.
Posted 1 month ago
14.0 - 24.0 years
14 - 24 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Cloud Data Architect: Cloud Architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing. Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation on AWS and Azure Cloud. Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services. Participate in pre-sales activities, including RFP and proposal writing. Experience integrating different data sources with data warehouses and data lakes is required. Experience in creating data warehouses and data lakes for reporting, AI, and machine learning. Understanding of data modelling and data architecture concepts. Participate in proposal and capability presentations. Able to clearly articulate the pros and cons of various technologies and platforms. Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms. Define and implement cloud governance and best practices. Identify and implement automation opportunities to increase operational efficiency. Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies.
Posted 1 month ago
14.0 - 24.0 years
14 - 24 Lacs
Noida, Uttar Pradesh, India
On-site
Cloud Data Architect: Cloud Architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing. Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation on AWS and Azure Cloud. Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services. Participate in pre-sales activities, including RFP and proposal writing. Experience integrating different data sources with data warehouses and data lakes is required. Experience in creating data warehouses and data lakes for reporting, AI, and machine learning. Understanding of data modelling and data architecture concepts. Participate in proposal and capability presentations. Able to clearly articulate the pros and cons of various technologies and platforms. Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms. Define and implement cloud governance and best practices. Identify and implement automation opportunities to increase operational efficiency. Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies.
Posted 1 month ago
14.0 - 24.0 years
14 - 24 Lacs
Chennai, Tamil Nadu, India
On-site
Cloud Data Architect: Cloud Architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing. Responsible for designing and implementing secure, scalable, and highly available cloud-based solutions, and for estimation on AWS and Azure Cloud. Experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services. Participate in pre-sales activities, including RFP and proposal writing. Experience integrating different data sources with data warehouses and data lakes is required. Experience in creating data warehouses and data lakes for reporting, AI, and machine learning. Understanding of data modelling and data architecture concepts. Participate in proposal and capability presentations. Able to clearly articulate the pros and cons of various technologies and platforms. Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms. Define and implement cloud governance and best practices. Identify and implement automation opportunities to increase operational efficiency. Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies.
Posted 1 month ago
0.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions, from large-scale models onward, tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Inviting applications for the role of Consultant - Sr. Data Engineer (DBT + Snowflake)! In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal. Job Description: Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities. Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy. Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents. Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows. Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance. Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis. Develop and maintain data documentation, best practices, and data governance protocols. Ensure data security, privacy, and compliance with organizational and regulatory guidelines. Responsibilities: Bachelor's degree in Computer Science, Data Engineering, or a related field. Experience in data engineering, with experience working with Snowflake. Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI. Strong proficiency in SQL, Python, and data modeling. Experience with data integration tools (e.g., Matillion, Talend, Informatica). Knowledge of cloud platforms such as AWS, Azure, or GCP. Excellent problem-solving skills, with a focus on data quality and performance optimization. Strong communication skills and the ability to work effectively in a cross-functional team. Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations. Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities. Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing. Should have experience building data ingestion pipelines.
Should have experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight. Should have good experience in implementing CDC or SCD Type 2. Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs. Good to have experience with repository tools like GitHub/GitLab or Azure Repos. Qualifications/Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skillsets. Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, Data Warehousing concepts. Why join Genpact? Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation. Make an impact - drive change for global enterprises and solve business challenges that matter. Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
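Since the posting calls out Airflow alongside DBT and Snowflake, the sketch below shows a minimal Airflow DAG that runs dbt against a Snowflake target on a schedule. The project path, target name, and cron schedule are assumptions, and a real deployment might use a dedicated dbt operator instead of BashOperator.

# Minimal Airflow DAG sketch: schedule a dbt run and test against Snowflake (paths are placeholders).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",   # run nightly at 02:00
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )

    dbt_run >> dbt_test   # build the models first, then run the tests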
Posted 1 month ago
3.0 - 6.0 years
0 Lacs
, India
On-site
About the Role: One of the most valuable assets in today's financial industry is data, which can provide businesses the intelligence essential to making business and financial decisions with conviction. This role will provide you an opportunity to work on Ratings and Research related data. You will get to work on cutting-edge big data technologies and will be responsible for development of both data feeds and APIs. The Team: RatingsXpress is at the heart of financial workflows when it comes to providing and analyzing data. We provide Ratings and Research information to clients. Our work deals with content ingestion and data feed generation, as well as exposing data to clients via API calls. This position is part of the RatingsXpress team and is focused on providing clients the critical data they need to make the most informed investment decisions possible. Impact: As a member of the Xpressfeed team in S&P Global Market Intelligence, you will work with a group of intelligent and visionary engineers to build impactful content management tools for investment professionals across the globe. Our software engineers are involved in the full product life cycle, from design through release. You will be expected to participate in application design, write high-quality code, and innovate on how to improve overall system performance and customer experience. If you are a talented developer who wants to help drive the next phase of Data Management Solutions at S&P Global, can contribute great ideas, solutions, and code, and understands the value of cloud solutions, we would like to talk to you. What's in it for you: We are currently seeking a Software Developer with a passion for full-stack development. In this role, you will have the opportunity to work on cutting-edge cloud technologies such as Databricks, Snowflake, and AWS, while also engaging in Scala and SQL Server-based database development. This position offers a unique opportunity to grow both as a Full Stack Developer and as a Cloud Engineer, expanding your expertise across modern data platforms and backend development. Responsibilities: Analyze, design, and develop solutions within a multi-functional Agile team to support key business needs for the data feeds. Design, implement, and test solutions using AWS EMR for content ingestion. Work on complex SQL Server projects involving high-volume data. Engineer components and common services based on standard corporate development models, languages, and tools. Apply software engineering best practices while also leveraging automation across all elements of solution delivery. Collaborate effectively with technical and non-technical stakeholders. Must be able to document and demonstrate technical solutions by developing documentation, diagrams, code comments, etc. Basic Qualifications: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field. 3-6 years of experience in application development. Minimum of 2 years of hands-on experience with Scala. Minimum of 2 years of hands-on experience with Microsoft SQL Server. Solid understanding of Amazon Web Services (AWS) and cloud-based development. In-depth knowledge of system architecture, object-oriented programming, and design patterns. Excellent communication skills, with the ability to convey complex ideas clearly both verbally and in writing. Preferred Qualifications: Familiarity with AWS services, EMR, Auto Scaling, EKS. Working knowledge of Snowflake.
Preferred experience in Python development. Familiarity with the Financial Services domain and Capital Markets is a plus. Experience developing systems that handle large volumes of data and require high computational performance. What's In It For You Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology-the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide-so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you-and your career-need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards-small perks can make a big difference. For more information on benefits by country visit: Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. 
Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision. 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings - (Strategic Workforce Planning)
Posted 1 month ago
3.0 - 6.0 years
0 Lacs
, India
On-site
About the Role: One of the most valuable assets in today's financial industry is data, which can provide businesses the intelligence essential to making business and financial decisions with conviction. This role will provide you an opportunity to work on Ratings and Research related data. You will get to work on cutting-edge big data technologies and will be responsible for development of both data feeds and APIs. The Team: RatingsXpress is at the heart of financial workflows when it comes to providing and analyzing data. We provide Ratings and Research information to clients. Our work deals with content ingestion and data feed generation, as well as exposing data to clients via API calls. This position is part of the RatingsXpress team and is focused on providing clients the critical data they need to make the most informed investment decisions possible. Impact: As a member of the Xpressfeed team in S&P Global Market Intelligence, you will work with a group of intelligent and visionary engineers to build impactful content management tools for investment professionals across the globe. Our software engineers are involved in the full product life cycle, from design through release. You will be expected to participate in application design, write high-quality code, and innovate on how to improve overall system performance and customer experience. If you are a talented developer who wants to help drive the next phase of Data Management Solutions at S&P Global, can contribute great ideas, solutions, and code, and understands the value of cloud solutions, we would like to talk to you. What's in it for you: We are currently seeking a Software Developer with a passion for full-stack development. In this role, you will have the opportunity to work on cutting-edge cloud technologies such as Databricks, Snowflake, and AWS, while also engaging in Scala and SQL Server-based database development. This position offers a unique opportunity to grow both as a Full Stack Developer and as a Cloud Engineer, expanding your expertise across modern data platforms and backend development. Responsibilities: Analyze, design, and develop solutions within a multi-functional Agile team to support key business needs for the data feeds. Design, implement, and test solutions using AWS EMR for content ingestion. Work on complex SQL Server projects involving high-volume data. Engineer components and common services based on standard corporate development models, languages, and tools. Apply software engineering best practices while also leveraging automation across all elements of solution delivery. Collaborate effectively with technical and non-technical stakeholders. Must be able to document and demonstrate technical solutions by developing documentation, diagrams, code comments, etc. Basic Qualifications: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field. 3-6 years of experience in application development. Minimum of 2 years of hands-on experience with Scala. Minimum of 2 years of hands-on experience with Microsoft SQL Server. Solid understanding of Amazon Web Services (AWS) and cloud-based development. In-depth knowledge of system architecture, object-oriented programming, and design patterns. Excellent communication skills, with the ability to convey complex ideas clearly both verbally and in writing. Preferred Qualifications: Familiarity with AWS services, EMR, Auto Scaling, EKS. Working knowledge of Snowflake.
Preferred experience in Python development. Familiarity with the Financial Services domain and Capital Markets is a plus. Experience developing systems that handle large volumes of data and require high computational performance. What's In It For You Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology-the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide-so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you-and your career-need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards-small perks can make a big difference. For more information on benefits by country visit: Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. 
Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision. 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings - (Strategic Workforce Planning)
Posted 1 month ago
8.0 - 14.0 years
15 - 30 Lacs
Mangalore, Karnataka, India
On-site
Job Title: Snowflake & SQL Developer Location: Bangalore/Mangalore Type: Full-Time Experience: 8+ years Why MResult Founded in 2004, MResult is a global digital solutions partner trusted by leading Fortune 500 companies in industries such as pharma & healthcare, retail, and BFSI. MResult's expertise in data and analytics, data engineering, machine learning, AI, and automation help companies streamline operations and unlock business value. As part of our team, you will collaborate with top minds in the industry to deliver cutting-edge solutions that solve real-world challenges. Website: https://mresult.com/ LinkedIn: https://www.linkedin.com/company/mresult/ What We Offer: At MResult, you can leave your mark on projects at the world's most recognized brands, access opportunities to grow and upskill, and do your best work with the flexibility of hybrid work models. Great work is rewarded, and leaders are nurtured from within. Our values Agility, Collaboration, Client Focus, Innovation, and Integrity are woven into our culture, guiding every decision. What This Role Requires In the role of Snowflake & SQL Developer, you will be a key contributor to MResult's mission of empowering our clients with data-driven insights and innovative digital solutions. Each day brings exciting challenges and growth opportunities. Here is what you will do: Roles and responsibilities: Overall 8+ years of experience and 5+ years of relevant experience in Snowflake and PostgreSQL. Design, develop, and optimize data pipelines and batch processes using PostgreSQL and Snowflake. Maintain and enhance the rules-based engine that drives territory alignment logic, ensuring scalability and performance. Troubleshoot and resolve data integration issues. Manage, Master, and Maximize with MResult MResult is an equal-opportunity employer committed to building an inclusive environment free of discrimination and harassment. Take the next step in your career with MResult where your ideas help shape the future.
Posted 1 month ago
4.0 - 6.0 years
0 Lacs
, India
On-site
At Roche you can show up as yourself, embraced for the unique qualities you bring. Our culture encourages personal expression, open dialogue, and genuine connections, where you are valued, accepted, and respected for who you are, allowing you to thrive both personally and professionally. This is how we aim to prevent, stop, and cure diseases and ensure everyone has access to healthcare today and for generations to come. Join Roche, where every voice matters. The Position: In Roche Informatics, we build on Roche's 125-year history as one of the world's largest biotech companies, globally recognized for providing transformative innovative solutions across major disease areas. We combine human capabilities with cutting-edge technological innovations to do now what our patients need next. Our commitment to our patients' needs motivates us to deliver technology that evolves the practice of medicine. Be part of our inclusive team at Roche Informatics, where we're driven by a shared passion for technological novelties and optimal IT solutions. About the position: We are looking for a Data Engineer who will work closely with multi-disciplinary and multi-cultural teams to build structured, high-quality data solutions. The person may lead technical squads. These solutions will be leveraged across Enterprise, Pharma, and Diagnostics solutions to help our teams fulfill our mission: to do now what patients need next. This position requires hands-on expertise in ETL pipeline development and data engineering. You should also be able to provide direction and guidance to developers, oversee development and unit testing, and document the developed solution. Building strong customer relationships for ongoing business is also a key aspect of this role. To succeed in this position, you should have experience with cloud-based data solution architectures, the Software Development Life Cycle (including both Agile and waterfall methodologies), data engineering and ETL tools/platforms, and data modeling practices. Your key responsibilities: Building and optimizing data ETL pipelines to support data analytics. Developing and implementing data integrations with other systems and platforms. Maintaining documentation for data pipelines and related processes. Logical and physical modeling of datasets and applications. Making Roche data assets accessible and findable across the organization. Exploring new ways of building, processing, and analyzing data in order to deliver insights to our business partners. Continuously refining data quality with testing, tooling, and performance evaluation. Working with business and functional stakeholders to understand data requirements and downstream analytics needs. Partnering with the business to ensure appropriate integration of functions to meet goals, as well as identifying and defining necessary system enhancements to deploy new products and process improvements. Fostering a data-driven culture throughout the team and leading data engineering projects that will have an impact throughout the organization. Working with data and analytics experts to strive for greater functionality in our data systems and products, and helping to grow our data team with exceptional engineers. Your qualifications and experience: Education in related fields (Computer Science, Computer Engineering, Mathematical Engineering, Information Systems) or job experience, preferably within multiple data engineering technologies. 4+ years of experience with ETL development, data engineering, and data quality assurance.
Good experience with Snowflake and its features. Hands-on experience as a Data Engineer in cloud data solutions using Snowflake. Experienced working with cloud platform services (AWS/Azure/GCP). Experienced in ETL/ELT technologies like Talend/DBT or other ETL platforms. Experience in preparing and reviewing new data flow patterns. Excellent Python skills. Strong RDBMS concepts and SQL development skills. Strong focus on data pipeline automation. Exposure to quality assurance and data quality activities is an added advantage. DevOps/DataOps experience (especially data operations) preferred. Readiness to work with multiple tech domains and streams. Passionate about new technologies and experimentation. Experience with Immuta and Monte Carlo is a plus. What you get: Good and stable working environment with an attractive compensation and rewards package (according to local regulations). Annual bonus payment based on performance. Access to various internal and external training platforms (e.g. LinkedIn Learning). Experienced and professional colleagues and a workplace that supports innovation. Multiple savings plans with employer match. The company's emphasis on employee wellness and work-life balance (e.g. generous vacation days and OneRoche Wellness Days). Workplace flexibility policy. State-of-the-art working environment and facilities. And many more that the Talent Acquisition Partner will be happy to talk about! Who we are: A healthier future drives us to innovate. Together, more than 100,000 employees across the globe are dedicated to advancing science, ensuring everyone has access to healthcare today and for generations to come. Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high, so we can deliver life-changing healthcare solutions that make a global impact. Let's build a healthier future, together. Roche is an Equal Opportunity Employer.
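For the data quality and testing responsibilities described above, a lightweight pattern is to run assertion queries against the warehouse after each load and fail the pipeline if any check finds offending rows. This sketch uses snowflake-connector-python; the connection details, tables, and checks are hypothetical, and the same rules are often expressed instead as dbt tests or Talend components.

# Sketch: post-load data quality checks, each expected to find zero offending rows.
# Connection details and check queries are illustrative placeholders.
import snowflake.connector

CHECKS = {
    "no_null_patient_ids": "SELECT COUNT(*) FROM CORE.VISITS WHERE PATIENT_ID IS NULL",
    "no_future_visit_dates": "SELECT COUNT(*) FROM CORE.VISITS WHERE VISIT_DATE > CURRENT_DATE",
    "no_duplicate_visits": (
        "SELECT COUNT(*) FROM ("
        "  SELECT VISIT_ID FROM CORE.VISITS GROUP BY VISIT_ID HAVING COUNT(*) > 1"
        ")"
    ),
}

def run_checks(conn):
    failures = []
    with conn.cursor() as cur:
        for name, sql in CHECKS.items():
            cur.execute(sql)
            offending = cur.fetchone()[0]
            if offending:
                failures.append(f"{name}: {offending} offending rows")
    return failures

conn = snowflake.connector.connect(account="my_account", user="dq_user", password="***")
problems = run_checks(conn)
if problems:
    raise RuntimeError("Data quality checks failed: " + "; ".join(problems))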
Posted 1 month ago
5.0 - 10.0 years
5 - 10 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Role & responsibilities: Design, develop, and optimize scalable data pipelines for ETL/ELT processes. Develop and maintain Python-based data processing scripts and automation tools. Write and optimize complex SQL queries (preferably in Snowflake) for data transformation and analytics. Experience with Jenkins or other CI/CD tools. Experience developing with Snowflake as the data platform. Experience with ETL/ELT tools (preferably Fivetran, dbt). Implement version control best practices using Git or other tools to manage code changes. Collaborate with cross-functional teams (analysts, product managers, and engineers) to understand business needs and translate them into technical data solutions. Ensure data integrity, security, and governance across multiple data sources. Optimize query performance and database architecture for efficiency and scalability. Lead troubleshooting and debugging efforts for data-related issues. Document data workflows, architectures, and best practices to ensure maintainability and knowledge sharing. Preferred candidate profile: 5+ years of experience in Data Engineering, Software Engineering, or a related field. Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related discipline. High proficiency in SQL (preferably Snowflake) for data modeling, performance tuning, and optimization. Strong expertise in Python for data processing and automation. Experience with Git or other version control tools in a collaborative development environment. Strong communication skills and the ability to collaborate with cross-functional teams for requirements gathering and solution design. Experience working with large-scale, distributed data systems and cloud data warehouses.
Posted 1 month ago
5.0 - 10.0 years
5 - 10 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
EagleView, the leader in aerial imagery, is hiring an Analytics and AI Development Lead in the Enterprise Data & Analytics team to help with implementing the organization's data and analytics strategy. This role entails leading a team of developers and data engineers overseeing the processes of AI tool development, automation and analytics, working closely with various departments across the US and India to ensure that data-driven insights are successfully integrated into business operations. The ideal candidate should possess a robust analytical background, exceptional leadership abilities, and a passion for utilizing data to address complex business challenges. This role is ideal for an experienced manager with strong experience in team management, Python, AWS, SQL (preferably Snowflake), Git and Jenkins. You will help build and manage our AI operations platform, helping deliver insights to a variety of stakeholders across all departments. Responsibilities Oversee and lead development for AI asset design, with a focus on efficiency Data governance to support quality AI, security, and compliance best practices Collaborate with cross-functional teams (analysts, product managers, and engineers) to understand business needs and translate them into technical solutions Lead team project delivery and communication to leadership Debug and optimize complex Python and SQL (preferably in Snowflake) Lead troubleshooting and debugging efforts for data pipelines and data quality issues Deliver high quality analytics for executive leadership on occasion Qualifications 5+ years of experience in lead role for a team within any of the Data Engineering, software engineering or Artificial Intelligence domains Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related discipline Experience developing with Snowflake as the data platform Strong communication skills and ability to collaborate with cross-functional teams for requirements gathering and solution design Experience working with large-scale, distributed data systems and cloud data warehouses Understanding of data modeling principles and database design High proficiency in SQL (preferably Snowflake) for data modeling, performance tuning, and optimization High proficiency in Python, particularly for building data sets (including ingestion and transformation), ideally with production level projects using data science and AI libraries (including LLMs, semantic models or predictive analytics) Preferred Experience: Design, develop, and optimize scalable data pipelines for ETL/ELT processes Experience with ETL/ELT tools (preferably Fivetran, dbt) Experience with Git or other version control tools in a collaborative development environment Develop and maintain Python-based data processing scripts and automation tools Administration of big data platforms, such as Snowflake Experience with Microsoft PowerBI and other visualization tools Knowledge of cloud platforms (AWS, GCP, Azure) and related services (e.g., S3, Lambda, BigQuery) Experience in AWS, particularly for building and deploying AI solutions (ideally using Bedrock or OpenAI)
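As a sketch of the kind of AWS-based AI tooling this role touches (Bedrock is listed only as an ideal), the snippet below invokes a Bedrock-hosted model with boto3. The model ID, region, prompt, and response parsing are assumptions for illustration; the exact payload and response schema vary by model family.

# Sketch: call an LLM hosted on Amazon Bedrock with boto3 (model ID and region are assumptions).
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",   # payload shape depends on the chosen model
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Summarize last quarter's order volume trends in two sentences."}
    ],
}

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
    contentType="application/json",
    body=json.dumps(body),
)
payload = json.loads(response["body"].read())
print(payload["content"][0]["text"])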
Posted 1 month ago
7.0 - 12.0 years
7 - 11 Lacs
Delhi, India
On-site
Key deliverables: Enhance and maintain the MDM platform to support business needs Develop data pipelines using Snowflake, Python, SQL, and orchestration tools like Airflow Monitor and improve system performance and troubleshoot data pipeline issues Resolve production issues and ensure platform reliability Role responsibilities: Collaborate with data engineering and analytics teams for scalable solutions Apply DevOps practices to streamline deployment and automation Integrate cloud-native tools and services (AWS, Azure) with the data platform Utilize dbt and version control (Git) for data transformation and management
Posted 1 month ago
7.0 - 12.0 years
7 - 11 Lacs
Pune, Maharashtra, India
On-site
Key deliverables: Enhance and maintain the MDM platform to support business needs Develop data pipelines using Snowflake, Python, SQL, and orchestration tools like Airflow Monitor and improve system performance and troubleshoot data pipeline issues Resolve production issues and ensure platform reliability Role responsibilities: Collaborate with data engineering and analytics teams for scalable solutions Apply DevOps practices to streamline deployment and automation Integrate cloud-native tools and services (AWS, Azure) with the data platform Utilize dbt and version control (Git) for data transformation and management
Posted 1 month ago
5.0 - 10.0 years
8 - 18 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Total Experience: 5+ years. Location: Hyderabad, Bangalore. Work Mode: Hybrid. Responsibilities: Translate business requirements into technical requirements as needed. Design and develop automated scripts for data pipelines to process and transform data per the requirements, and monitor them. Produce artifacts such as data flow diagrams, designs, and data models, along with Git code, as deliverables. Use tools and programming languages such as SQL, Snowflake, Airflow, dbt, and Salesforce Data Cloud. Ensure data accuracy, timeliness, and reliability throughout the pipeline. Complete QA and data profiling to ensure data is ready for UAT as per the requirements. Collaborate with business stakeholders and the visualization team, and support enhancements. Provide timely updates on sprint boards and tasks. As team lead, provide timely project updates on all projects. Project experience with version control systems and CI/CD such as Git, GitFlow, Bitbucket, Jenkins, etc. Participate in UAT to resolve findings and plan go-live/production deployment.
Posted 1 month ago
8.0 - 13.0 years
7 - 13 Lacs
Chennai, Tamil Nadu, India
Remote
We are seeking a highly skilled and experienced Data Architect with strong expertise in data modeling and Snowflake to design, develop, and optimize enterprise data architecture. The ideal candidate will play a critical role in shaping data strategy, building scalable models, and ensuring efficient data integration and governance. Key Responsibilities: Design and implement end-to-end data architecture using Snowflake. Develop and maintain conceptual, logical, and physical data models. Define and enforce data architecture standards, best practices, and policies. Collaborate with data engineers, analysts, and business stakeholders to gather requirements and design data solutions. Optimize Snowflake performance, including data partitioning, caching, and query tuning. Create and manage data dictionaries, metadata, and lineage documentation. Ensure data quality, consistency, and security across all data platforms. Support data integration from various sources (cloud/on-premises) into Snowflake. Required Skills and Experience: 8+ years of experience in data architecture, data modeling, or similar roles. Hands-on expertise with Snowflake, including Snowpipe, Streams, Tasks, and Secure Data Sharing. Strong experience with data modeling tools (e.g., Erwin, ER/Studio, dbt). Proficiency in SQL, ETL/ELT pipelines, and data warehousing concepts. Experience working with structured, semi-structured (JSON, XML), and unstructured data. Solid understanding of data governance, data cataloging, and security frameworks. Excellent analytical, communication, and stakeholder management skills. Preferred Qualifications: Experience with cloud platforms like AWS, Azure, or GCP. Familiarity with data lakehouse architecture and real-time data processing. Snowflake certification(s) or relevant cloud certifications. Knowledge of Python or scripting for data automation is a plus.
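Because the role calls for working with semi-structured data such as JSON in Snowflake, the sketch below shows a typical pattern: query a VARIANT column with FLATTEN to unnest an array. It is run here through snowflake-connector-python with placeholder account, table, and field names; the embedded SQL works just as well directly in a Snowflake worksheet.

# Sketch: query semi-structured JSON stored in a VARIANT column (names are placeholders).
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="arch_user", password="***",
                                   warehouse="QUERY_WH", database="RAW", schema="EVENTS")

flatten_sql = """
    SELECT
        e.payload:order_id::string   AS order_id,
        item.value:sku::string       AS sku,
        item.value:quantity::number  AS quantity
    FROM RAW.EVENTS.ORDER_EVENTS e,
         LATERAL FLATTEN(input => e.payload:items) item
    LIMIT 100
"""

with conn.cursor() as cur:
    cur.execute(flatten_sql)
    for order_id, sku, quantity in cur.fetchall():
        print(order_id, sku, quantity)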
Posted 1 month ago
8.0 - 13.0 years
7 - 13 Lacs
Bengaluru / Bangalore, Karnataka, India
Remote
We are seeking a highly skilled and experienced Data Architect with strong expertise in data modeling and Snowflake to design, develop, and optimize enterprise data architecture. The ideal candidate will play a critical role in shaping data strategy, building scalable models, and ensuring efficient data integration and governance. Key Responsibilities: Design and implement end-to-end data architecture using Snowflake. Develop and maintain conceptual, logical, and physical data models. Define and enforce data architecture standards, best practices, and policies. Collaborate with data engineers, analysts, and business stakeholders to gather requirements and design data solutions. Optimize Snowflake performance, including data partitioning, caching, and query tuning. Create and manage data dictionaries, metadata, and lineage documentation. Ensure data quality, consistency, and security across all data platforms. Support data integration from various sources (cloud/on-premises) into Snowflake. Required Skills and Experience: 8+ years of experience in data architecture, data modeling, or similar roles. Hands-on expertise with Snowflake, including Snowpipe, Streams, Tasks, and Secure Data Sharing. Strong experience with data modeling tools (e.g., Erwin, ER/Studio, dbt). Proficiency in SQL, ETL/ELT pipelines, and data warehousing concepts. Experience working with structured, semi-structured (JSON, XML), and unstructured data. Solid understanding of data governance, data cataloging, and security frameworks. Excellent analytical, communication, and stakeholder management skills. Preferred Qualifications: Experience with cloud platforms like AWS, Azure, or GCP. Familiarity with data lakehouse architecture and real-time data processing. Snowflake certification(s) or relevant cloud certifications. Knowledge of Python or scripting for data automation is a plus.
Posted 1 month ago
1.0 - 4.0 years
1 - 4 Lacs
Pune, Maharashtra, India
On-site
Duties & Responsibilities: Collaborate with cross-functional teams to understand business requirements and translate them into data integration solutions. Develop and maintain ETL/ELT pipelines using modern tools like Informatica IDMC to connect source systems to Snowflake. Ensure data accuracy, consistency, and security in all integration workflows. Monitor, troubleshoot, and optimize data integration processes to meet performance and scalability goals. Support ongoing integration projects, including Salesforce and SAP data pipelines, while adhering to best practices in data governance. Document integration designs, workflows, and operational processes for effective knowledge sharing. Assist in implementing and improving data quality controls at the start of processes to ensure reliable outcomes. Stay informed about the latest developments in integration technologies and contribute to team learning and improvement. Qualifications: Required Skills and Experience: 5+ years of hands-on experience in data integration, ETL/ELT development, or data engineering. Proficiency in SQL and experience working with relational databases such as Snowflake, PostgreSQL, or SQL Server. Familiarity with data integration tools such as Fivetran, Informatica Intelligent Data Management Cloud (IDMC), or similar platforms. Basic understanding of cloud platforms like AWS, Azure, or GCP. Experience working with structured and unstructured data in varying formats (e.g., JSON, XML, CSV). Strong problem-solving skills and the ability to troubleshoot data integration issues effectively. Excellent verbal and written communication skills, with the ability to document technical solutions clearly. Preferred Skills and Experience: Exposure to integrating business systems such as Salesforce or SAP into data platforms. Knowledge of data warehousing concepts and hands-on experience with Snowflake. Familiarity with APIs, event-driven pipelines, and automation workflows. Understanding of data governance principles and data quality best practices. Education: Bachelor's degree in Computer Science, Data Engineering, or a related field, or equivalent practical experience. What We Offer: A collaborative and mission-driven work environment at the forefront of EdTech innovation. Opportunities for growth, learning, and professional development. Competitive salary and benefits package, including support for certifications like Snowflake SnowPro Core and Informatica Cloud certifications.
Posted 1 month ago